
    Hierarchic Superposition Revisited

    Many applications of automated deduction require reasoning in first-order logic modulo background theories, in particular some form of integer arithmetic. A major unsolved research challenge is to design theorem provers that are "reasonably complete" even in the presence of free function symbols ranging into a background-theory sort. The hierarchic superposition calculus of Bachmair, Ganzinger, and Waldmann already supports such symbols but, as we demonstrate, not optimally. This paper aims to rectify the situation by introducing a novel form of clause abstraction, a core component in the hierarchic superposition calculus for transforming clauses into a form needed for internal operation. We argue for the benefits of the resulting calculus and provide two new completeness results: one for the fragment where all background-sorted terms are ground and another for a special case of linear (integer or rational) arithmetic as a background theory.
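    A toy sketch of the abstraction idea (our own term encoding and helper names, not the paper's calculus): background-sorted subterms are replaced by fresh variables plus defining constraints, so the foreground calculus never operates on raw background terms below free function symbols.

```python
import itertools

# Toy illustration of clause abstraction (our own encoding, not the
# paper's calculus): integer literals stand in for background-sorted
# subterms; each is pulled out and replaced by a fresh variable plus
# a defining constraint X = t.
def abstract(term, counter=None):
    counter = counter if counter is not None else itertools.count()
    constraints = []
    def walk(t):
        if isinstance(t, int):           # background-sorted subterm
            x = f"X{next(counter)}"
            constraints.append((x, t))   # record the constraint X = t
            return x
        if isinstance(t, tuple):         # (function symbol, args...)
            return (t[0],) + tuple(walk(a) for a in t[1:])
        return t
    return walk(term), constraints

# f(g(3), 5) becomes f(g(X0), X1) with constraints X0 = 3 and X1 = 5.
term, cs = abstract(("f", ("g", 3), 5))
print(term, cs)
```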

    Large time dynamics and aging of a polymer chain in a random potential

    We study the out-of-equilibrium large-time dynamics of a Gaussian polymer chain in a quenched random potential. The dynamics studied is a simple Langevin dynamics commonly referred to as the Rouse model. The equations for the two-time correlation and response functions are derived within the Gaussian variational approximation. In order to implement this approximation faithfully, we employ the supersymmetric representation of the Martin-Siggia-Rose dynamical action. For a short-range correlated random potential the equations are solved analytically in the limit of large times using certain assumptions concerning the asymptotic behavior. Two possible dynamical behaviors are identified depending upon the time separation: a stationary regime and an aging regime. In the stationary regime time-translation invariance holds, as does the fluctuation-dissipation theorem. The aging regime, which occurs for large time separations of the two-time correlation functions, is characterized by history dependence and the breakdown of certain equilibrium relations. The large-time limit of the equations yields equations among the order parameters that are similar to the equations obtained in the statics using replicas. In particular, the aging solution corresponds to the broken-replica solution, but there is a difference in one equation that leads to important consequences for the solution. The stationary regime corresponds to the motion of the polymer inside a local minimum of the random potential, whereas in the aging regime the polymer hops between different minima. As a byproduct we also solve exactly the dynamics of a chain in a random potential with quadratic correlations. Comment: 21 pages, RevTeX.
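    A minimal sketch of the Rouse (overdamped Langevin) dynamics for a free Gaussian chain, without the random potential or the variational machinery of the paper; the parameter names and discretization choices are our own.

```python
import numpy as np

def rouse_step(r, k=1.0, T=1.0, dt=0.01, rng=None):
    """One Euler-Maruyama step of overdamped Rouse dynamics.

    r: (N, d) array of bead positions, free-end boundary conditions.
    Bead friction is set to 1; neighboring beads are coupled by
    harmonic springs of constant k; thermal noise has variance
    2*T*dt per component.
    """
    rng = np.random.default_rng() if rng is None else rng
    f = np.zeros_like(r)
    f[1:] += k * (r[:-1] - r[1:])   # pull from the left neighbor
    f[:-1] += k * (r[1:] - r[:-1])  # pull from the right neighbor
    return r + f * dt + np.sqrt(2 * T * dt) * rng.standard_normal(r.shape)

# Equilibrate a short chain in 3D and sample the squared end-to-end
# distance; at equilibrium its mean approaches (N - 1) * d * T / k = 45
# for these parameters.
rng = np.random.default_rng(0)
N, d = 16, 3
r = np.zeros((N, d))
samples = []
for step in range(20000):
    r = rouse_step(r, rng=rng)
    if step > 5000 and step % 50 == 0:
        samples.append(np.sum((r[-1] - r[0]) ** 2))
print(np.mean(samples))
```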

    Replica field theory for a polymer in random media

    In this paper we revisit the problem of a (non-self-avoiding) polymer chain in a random medium, which was previously investigated by Edwards and Muthukumar (EM). As noticed by Cates and Ball (CB), there is a discrepancy between the predictions of the replica calculation of EM and the expectation that in an infinite medium the quenched and annealed results should coincide (for a chain that is free to move) and that a long polymer should always collapse. CB argued that only in a finite volume might one see a "localization transition" (or crossover) from a stretched to a collapsed chain in three spatial dimensions. Here we carry out the replica calculation in the presence of an additional confining harmonic potential that mimics the effect of a finite volume. Using a variational scheme with five variational parameters, we derive analytically for d < 4 the result R ~ (g |ln μ|)^{-1/(4-d)} ~ (g ln V)^{-1/(4-d)}, where R is the radius of gyration, g is the strength of the disorder, μ is the spring constant associated with the confining potential, and V is the associated effective volume of the system. Thus the EM result is recovered with their constant replaced by ln V, as argued by CB. We see that in the strict infinite-volume limit the polymer always collapses, but for finite volume a transition from a stretched to a collapsed form might be observed as a function of the strength of the disorder. For d < 2 and for large V > V' ~ exp[g^{2/(2-d)} L^{(4-d)/(2-d)}] the annealed results are recovered and R ~ (Lg)^{1/(d-2)}, where L is the length of the polymer. Hence the polymer also collapses in the large-L limit. The one-step replica-symmetry-breaking solution is crucial for obtaining the above results. Comment: RevTeX, 32 pages.
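    The quoted scaling prediction can be written as a small function (the overall prefactor is an undetermined constant, and the function name is ours):

```python
# The replica variational result R ~ (g * ln V)^(-1/(4-d)) for d < 4,
# with an undetermined dimensionless prefactor.
def radius_of_gyration(g, ln_v, d=3, prefactor=1.0):
    return prefactor * (g * ln_v) ** (-1.0 / (4 - d))

# In d = 3 the exponent is -1: the chain shrinks as either the disorder
# strength g or the effective volume V grows.
print(radius_of_gyration(1.0, 10.0))  # 0.1
print(radius_of_gyration(4.0, 10.0))  # 0.025
```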

    Localization of a polymer in random media: Relation to the localization of a quantum particle

    In this paper we consider in detail the connection between the problem of a polymer in a random medium and that of a quantum particle in a random potential. We are interested in a system of finite volume where the polymer is known to be localized inside a low minimum of the potential. We show how the end-to-end distance of a polymer which is free to move can be obtained from the density of states of the quantum particle using extreme-value statistics. We give a physical interpretation to the recently discovered one-step replica-symmetry-breaking solution for the polymer (Phys. Rev. E 61, 1729 (2000)) in terms of the statistics of localized tail states. Numerical solutions of the variational equations for chains of different length are performed and compared with quenched averages computed directly by using the eigenfunctions and eigenenergies of the Schrödinger equation for a particle in a one-dimensional random potential. The quantities investigated are the radius of gyration of a free Gaussian chain, its mean-square distance from the origin, and the end-to-end distance of a tethered chain. The probability distribution for the position of the chain is also investigated. The glassiness of the system is explained and is estimated from the variance of the measured quantities. Comment: RevTeX, 44 pages, 13 figures.
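    The quantum side of the mapping can be sketched with a finite-difference eigensolve (our own discretization choices and parameter values, not the paper's method):

```python
import numpy as np

def random_potential_spectrum(n=200, L=200.0, g=1.0, seed=0):
    """Eigenpairs of -psi'' + V(x) psi = E psi on [0, L] with Dirichlet
    boundaries; V is an uncorrelated Gaussian random potential of
    strength g, discretized by central finite differences."""
    rng = np.random.default_rng(seed)
    dx = L / (n + 1)
    V = g * rng.standard_normal(n)
    main = 2.0 / dx**2 + V         # diagonal: kinetic + potential
    off = -np.ones(n - 1) / dx**2  # off-diagonal kinetic coupling
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigh(H)       # eigenvalues ascending, states in columns

E, psi = random_potential_spectrum()
# Low-lying states sit in deep wells of the random potential and are
# localized: the participation ratio of the ground state is far below
# the grid size, which is what makes the tail-state statistics relevant.
participation = 1.0 / np.sum(psi[:, 0] ** 4)
print(E[0], participation)
```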

    The Infocus Hard X-ray Telescope: Pixellated CZT Detector/Shield Performance and Flight Results

    The CZT detector on the Infocus hard X-ray telescope is a pixellated solid-state device capable of imaging spectroscopy by measuring the position and energy of each incoming photon. The detector sits at the focal point of an 8 m focal-length multilayered grazing-incidence X-ray mirror which has significant effective area between 20 and 40 keV. The detector has an energy resolution of 4.0 keV at 32 keV, and the Infocus telescope has an angular resolution of 2.2 arcminutes and a field of view of about 10 arcminutes. Infocus flew on a balloon mission in July 2001 and observed Cygnus X-1. We present results from laboratory testing of the detector to measure the uniformity of response across the detector, to determine the spectral resolution, and to perform a simple noise decomposition. We also present a hard X-ray spectrum and image of Cygnus X-1, and measurements of the hard X-ray CZT background obtained with the SWIN detector on Infocus. Comment: To appear in the proceedings of the SPIE conference "Astronomical Telescopes and Instrumentation", #4851-116, Kona, Hawaii, Aug. 22-28, 2002. 12 pages, 9 figures.

    Two-qutrit Entanglement Witnesses and Gell-Mann Matrices

    The Gell-Mann λ matrices for the Lie algebra su(3) are the natural basis for the Hilbert space of Hermitian operators acting on the states of a three-level system (qutrit). The construction of entanglement witnesses (EWs) for two-qutrit states using these matrices is therefore an interesting problem. In this paper, several two-qutrit EWs are constructed based on the Gell-Mann matrices by using the linear programming (LP) method, either exactly or approximately. The decomposability and non-decomposability of the constructed EWs are also discussed, and it is shown that the λ-diagonal EWs presented in this paper are all decomposable, but there exist non-decomposable ones among the λ-non-diagonal EWs. Comment: 25 pages.
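    The operator basis in question is easy to write down and check numerically; the sketch below only constructs the eight Gell-Mann matrices and verifies their orthogonality, not the paper's LP construction of witnesses.

```python
import numpy as np

# The eight Gell-Mann matrices: the standard Hermitian, traceless basis
# of su(3) in which two-qutrit operators (and hence EWs) are expanded.
s3 = np.sqrt(3)
gell_mann = [
    np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], complex),        # λ1
    np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]], complex),     # λ2
    np.array([[1, 0, 0], [0, -1, 0], [0, 0, 0]], complex),       # λ3
    np.array([[0, 0, 1], [0, 0, 0], [1, 0, 0]], complex),        # λ4
    np.array([[0, 0, -1j], [0, 0, 0], [1j, 0, 0]], complex),     # λ5
    np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], complex),        # λ6
    np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]], complex),     # λ7
    np.array([[1, 0, 0], [0, 1, 0], [0, 0, -2]], complex) / s3,  # λ8
]

# Orthogonality Tr(λi λj) = 2 δij makes the expansion coefficient of any
# Hermitian operator along λi directly computable as Tr(A λi) / 2.
G = np.array([[np.trace(a @ b).real for b in gell_mann] for a in gell_mann])
print(np.allclose(G, 2 * np.eye(8)))  # True
```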

    Climate, soil or both? Which variables are better predictors of the distributions of Australian shrub species?

    © 2017 Hageer et al. Background. Shrubs play a key role in biogeochemical cycles, prevent soil and water erosion, provide forage for livestock, and are a source of food, wood and non-wood products. However, despite their ecological and societal importance, the influence of different environmental variables on shrub distributions remains unclear. We evaluated the influence of climate and soil characteristics, and whether including soil variables improved the performance of a species distribution model (SDM), Maxent. Methods. This study assessed variation in predictions of environmental suitability for 29 Australian shrub species (representing dominant members of six shrubland classes) due to the use of alternative sets of predictor variables. Models were calibrated with (1) climate variables only, (2) climate and soil variables, and (3) soil variables only. Results. The predictive power of the SDMs differed substantially across species, but models calibrated with both climate and soil data generally performed better than those calibrated with climate variables alone. Models calibrated solely with soil variables were the least accurate. We found regional differences in potential shrub species richness across Australia due to the use of different sets of variables. Conclusions. Our study provides evidence that predicted patterns of species richness may be sensitive to the choice of predictor set when multiple plausible alternatives exist, and demonstrates the importance of considering soil properties when modeling the availability of habitat for plants.

    Intermediate Element Abundances in Galaxy Clusters

    We present the average abundances of the intermediate elements obtained by performing a stacked analysis of all the galaxy clusters in the archive of the X-ray telescope ASCA. We determine the abundances of Fe, Si, S, and Ni as a function of cluster temperature (mass) from 1 to 10 keV, and place strong upper limits on the abundances of Ca and Ar. In general, Si and Ni are overabundant with respect to Fe, while Ar and Ca are very underabundant. The discrepancy between the abundances of Si, S, Ar, and Ca indicates that the alpha-elements do not behave homogeneously as a single group. We show that the abundances of the best-determined elements Fe, Si, and S, in conjunction with recent theoretical supernova yields, do not give a consistent solution for the fraction of material produced by Type Ia and Type II supernovae at any temperature or mass. The general trend is for higher-temperature clusters to have more of their metals produced in Type II supernovae than in Type Ia supernovae. The inconsistency of our results with abundances in the Milky Way indicates that spiral galaxies are not the dominant metal contributors to the intracluster medium (ICM). The pattern of elemental abundances requires an additional source of metals beyond standard SNIa and SNII enrichment. The properties of this new source are well matched to those of Type II supernovae with very massive, metal-poor progenitor stars. These results are consistent with a significant fraction of the ICM metals having been produced by an early generation of population III stars. Comment: 18 pages, 11 figures, 7 tables. Submitted to Ap
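    The consistency test described above amounts to a small least-squares problem. The sketch below uses placeholder numbers (not the published yields or the measured ICM abundances) purely to show the procedure:

```python
import numpy as np

# PLACEHOLDER yields and abundances, illustrative only -- not the
# published SN Ia / SN II yield tables or the measured ICM values.
# The procedure: solve observed = N_Ia * y_Ia + N_II * y_II in a
# least-squares sense and ask whether one (N_Ia, N_II) pair fits
# all elements at once.
elements = ["Fe", "Si", "S"]
y_Ia = np.array([0.74, 0.15, 0.09])      # placeholder yields per event (Msun)
y_II = np.array([0.09, 0.12, 0.04])      # placeholder yields per event (Msun)
observed = np.array([0.40, 0.14, 0.06])  # placeholder ICM mass pattern

A = np.column_stack([y_Ia, y_II])
frac, residual, rank, _ = np.linalg.lstsq(A, observed, rcond=None)
print(dict(zip(["N_Ia", "N_II"], np.round(frac, 3))))
# A residual that is large compared with the measurement errors would
# signal, as in the paper, that standard SNIa + SNII enrichment alone
# cannot reproduce the abundance pattern.
```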

    Intelligent Self-Repairable Web Wrappers

    The amount of information available on the Web grows at an incredibly high rate. Systems and procedures devised to extract these data from Web sources already exist, and different approaches and techniques have been investigated in recent years. On the one hand, reliable solutions should provide robust Web data mining algorithms that can automatically cope with possible malfunctioning or failures. On the other hand, the literature lacks solutions for the maintenance of these systems. Procedures that extract Web data may be tightly coupled to the structure of the data source itself; thus, malfunctioning or acquisition of corrupted data could be caused, for example, by structural modifications of data sources introduced by their owners. Nowadays, verification of data integrity and maintenance are mostly managed manually, in order to ensure that these systems work correctly and reliably. In this paper we propose a novel approach to create procedures able to extract data from Web sources -- the so-called Web wrappers -- which can cope with possible malfunctioning caused by modifications of the structure of the data source, and can automatically repair themselves.
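    A minimal sketch of the self-repair idea (our own toy, not the authors' algorithm): the wrapper keeps alternative extraction rules and, when the page structure changes, promotes the first fallback that still matches.

```python
import re

class SelfRepairingWrapper:
    """Toy wrapper holding an ordered list of extraction rules
    (regex patterns here; real wrappers typically use XPath/CSS)."""

    def __init__(self, rules):
        self.rules = list(rules)  # preferred rule first

    def extract(self, html):
        for i, pattern in enumerate(self.rules):
            m = re.search(pattern, html, re.S)
            if m:
                if i > 0:  # a fallback matched: promote it (self-repair)
                    self.rules.insert(0, self.rules.pop(i))
                return m.group(1)
        raise ValueError("all rules failed; wrapper must be regenerated")

wrapper = SelfRepairingWrapper([
    r'<span class="price">(.*?)</span>',    # rule for the original layout
    r'<div class="price-tag">(.*?)</div>',  # fallback for a redesigned page
])
# The page was redesigned: the first rule fails, the fallback extracts the
# value and moves to the front so later calls succeed immediately.
print(wrapper.extract('<div class="price-tag">19,90</div>'))  # 19,90
```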